OpenELM is a family of open-source efficient language models that use a layer-wise scaling strategy to allocate parameters across transformer layers, improving model accuracy. It is released in four parameter sizes: 270M, 450M, 1.1B, and 3B, each available in both pretrained and instruction-tuned variants.
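For reference, below is a minimal sketch of loading one of these checkpoints with the Hugging Face Transformers library. The repository ID (`apple/OpenELM-270M-Instruct`) and the reuse of the Llama-2 tokenizer are assumptions based on the public Hugging Face release and may need to be adjusted for your environment.

```python
# Minimal sketch: load an OpenELM checkpoint and generate text with Transformers.
# Repo IDs below are assumptions; the 450M/1.1B/3B variants follow the same pattern.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "apple/OpenELM-270M-Instruct"   # assumed checkpoint ID on the Hugging Face Hub
TOKENIZER_ID = "meta-llama/Llama-2-7b-hf"  # OpenELM reuses the Llama-2 tokenizer (gated repo)

# OpenELM ships custom modeling code, so remote code execution must be enabled.
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    trust_remote_code=True,
    torch_dtype=torch.float16,
)
tokenizer = AutoTokenizer.from_pretrained(TOKENIZER_ID)

# Simple greedy generation from a short prompt.
inputs = tokenizer("Once upon a time there was", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```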
Natural Language Processing
Transformers